A comparative investigation on subspace dimension determination

Authors

  • Xuelei Hu
  • Lei Xu
Abstract

It is well known that constrained Hebbian self-organization on multiple linear neural units leads to the k-dimensional subspace spanned by the first k principal components. Not only has the batch PCA algorithm been widely applied in various fields since the 1930s, but a variety of adaptive algorithms have also been proposed over the past two decades. However, most studies assume a known dimension k or determine it heuristically, even though a number of model selection criteria exist in the statistics literature. Recently, criteria have also been obtained under the framework of Bayesian Ying-Yang (BYY) harmony learning. This paper further investigates the BYY criteria in comparison with existing typical criteria, including Akaike's information criterion (AIC), the consistent Akaike's information criterion (CAIC), the Bayesian inference criterion (BIC), and the cross-validation (CV) criterion. The comparison is made via experiments not only on simulated data sets with different sample sizes, noise variances, data space dimensions, and subspace dimensions, but also on two real data sets, one from an air pollution problem and one of sports track records. The experiments show that BIC outperforms AIC, CAIC, and CV, while the BYY criteria are either comparable with or better than BIC. BYY harmony learning is therefore the preferred tool for subspace dimension determination, especially since an appropriate subspace dimension k can be determined automatically while BYY harmony learning fits the principal subspace, whereas selection by BIC, AIC, CAIC, or CV must be made in a second stage over a set of candidate subspaces of different dimensions obtained in a first learning stage.
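To make the two-stage selection concrete, the sketch below scores candidate subspace dimensions with AIC, BIC, and CAIC computed from the maximized log-likelihood of a probabilistic PCA model. This is a minimal illustration, not the paper's implementation: the PPCA likelihood form, the parameter count in n_params, and the simulated data are assumptions made for illustration; CV and the BYY criteria are omitted, and the paper's exact penalty terms may differ.

```python
import numpy as np

def ppca_loglik(eigvals, n, k):
    """Maximized log-likelihood of a probabilistic PCA model with a k-dim
    principal subspace (Tipping-Bishop form), from sample covariance
    eigenvalues sorted in descending order."""
    d = len(eigvals)
    sigma2 = eigvals[k:].mean()          # ML estimate of the residual noise variance
    return -0.5 * n * (d * np.log(2 * np.pi)
                       + np.sum(np.log(eigvals[:k]))
                       + (d - k) * np.log(sigma2)
                       + d)

def n_params(d, k):
    # Free parameters of the k-dim model: mean vector, loading matrix
    # (counted up to an arbitrary rotation), and the noise variance.
    return d + d * k - k * (k - 1) // 2 + 1

def select_dim(X, k_max=None):
    """Two-stage selection: first obtain the candidate subspaces (here via one
    eigen-decomposition), then score every candidate dimension k."""
    n, d = X.shape
    k_max = k_max or d - 1
    eigvals = np.sort(np.linalg.eigvalsh(np.cov(X, rowvar=False)))[::-1]
    best = {}
    for k in range(1, k_max + 1):
        ll, m = ppca_loglik(eigvals, n, k), n_params(d, k)
        scores = {"AIC": -2 * ll + 2 * m,
                  "BIC": -2 * ll + m * np.log(n),
                  "CAIC": -2 * ll + m * (np.log(n) + 1)}
        for crit, val in scores.items():
            if crit not in best or val < best[crit][1]:
                best[crit] = (k, val)
    return {crit: k for crit, (k, _) in best.items()}

# Usage: a 3-dimensional subspace embedded in a 10-dimensional space with noise.
rng = np.random.default_rng(0)
W = rng.normal(size=(10, 3))
X = rng.normal(size=(500, 3)) @ W.T + 0.3 * rng.normal(size=(500, 10))
print(select_dim(X))                     # e.g. {'AIC': 3, 'BIC': 3, 'CAIC': 3}
```

Cross-validation would add an inner loop that refits on held-out splits of the data. The one-stage advantage highlighted in the abstract is that BYY harmony learning can prune k automatically during a single run of subspace fitting, with no such candidate sweep.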

Related resources

Isotropic Constant Dimension Subspace Codes

In the network coding setting, a constant dimension code is a set of k-dimensional subspaces of F_q^n. If F_q^n is a nondegenerate symplectic vector space with bilinear form f, an isotropic subspace U of F_q^n is a subspace such that f(x, y) = 0 for all x, y ∈ U. We introduce isotropic subspace codes simply as sets of isotropic subspaces, show how the isotropic property is used in the decoding process, then...
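As a side note on the isotropy condition quoted above: by bilinearity, a subspace is isotropic exactly when the form vanishes on a basis, so the check reduces to one matrix product. The sketch below assumes the standard symplectic form given by the block matrix J over a prime field q; the variable names and the example basis are illustrative only.

```python
import numpy as np

def is_isotropic(B, q):
    """Check whether the row space of B (a basis over GF(q), q prime) is isotropic
    under the standard symplectic form f(x, y) = x^T J y, J = [[0, I], [-I, 0]]."""
    n = B.shape[1]
    m = n // 2
    J = np.block([[np.zeros((m, m), int), np.eye(m, dtype=int)],
                  [-np.eye(m, dtype=int), np.zeros((m, m), int)]])
    return np.all((B @ J @ B.T) % q == 0)   # f vanishes on all pairs of basis vectors

# Example over GF(3) in F_3^4: the span of (1,0,0,0) and (0,1,0,0) is isotropic.
B = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])
print(is_isotropic(B, 3))   # True
```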

Greedy Tikhonov regularization for large linear ill-posed problems

Several numerical methods for the solution of large linear ill-posed problems combine Tikhonov regularization with an iterative method based on partial Lanczos bidiagonalization of the operator. This paper discusses the determination of the regularization parameter and the dimension of the Krylov subspace for this kind of method. A method that requires a Krylov subspace of minimal dimension is...
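For orientation, Tikhonov regularization replaces the ill-posed least-squares problem with min_x ||Ax - b||^2 + lambda ||x||^2. The dense-matrix sketch below is only meant to show the role of the regularization parameter; the large-scale methods the entry refers to instead restrict the problem to a low-dimensional Krylov subspace built by Lanczos bidiagonalization, which is not reproduced here.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Tikhonov-regularized solution of min ||Ax - b||^2 + lam * ||x||^2,
    via the normal equations (A^T A + lam I) x = A^T b (fine for small dense A)."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Larger lam damps the solution norm at the cost of a larger residual.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 20))
b = rng.normal(size=50)
for lam in (1e-3, 1e-1, 10.0):
    x = tikhonov(A, b, lam)
    print(lam, np.linalg.norm(A @ x - b), np.linalg.norm(x))
```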

Accurate subspace tracking algorithms based on cross-space properties

In this paper, we analyse the issue of efficiently using Givens rotations to perform a more accurate SVD-based subspace tracking. We propose an alternative type of decomposition which allows a more versatile use of Givens rotations. We also show the direct effect of the latter on the tracking error, and develop a cross-terms cancellation concept which leads to a class of high performance algorithm...
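As background for the entry above, a single Givens rotation is the 2x2 orthogonal transform used to zero one entry of a vector pair; subspace trackers chain many of them to update an SVD cheaply. The helper below only constructs one such rotation and is a generic textbook construction, not the decomposition proposed in the cited paper.

```python
import numpy as np

def givens(a, b):
    """Return (c, s) such that [[c, s], [-s, c]] @ [a, b]^T = [r, 0], r = hypot(a, b)."""
    r = np.hypot(a, b)
    if r == 0.0:
        return 1.0, 0.0            # nothing to rotate
    return a / r, b / r

c, s = givens(3.0, 4.0)
G = np.array([[c, s], [-s, c]])
print(G @ np.array([3.0, 4.0]))    # -> [5., 0.]
```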

Order Statistics Approach to Estimation of the Dimension of the Noise Subspace

Model order selection and, in particular, determination of the dimension of the noise subspace, is an important problem in statistical signal processing. The discrete nature of the problem puts it in between detection and estimation. Standard tools from detection theory force a solution subject to arbitrary false alarm probability. On the other hand, direct maximum likelihood (ML) approach requ...
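For context, a classical eigenvalue-based alternative (the AIC/MDL criteria of Wax and Kailath) scores each candidate split by how far the trailing sample eigenvalues are from being equal. The sketch below follows that classical criterion, not the order-statistics method the entry describes; the snapshot model and the simulated example are illustrative assumptions.

```python
import numpy as np

def noise_subspace_dim(eigvals, N):
    """Estimate the signal dimension k (noise subspace of dimension p - k)
    by minimizing the classical AIC/MDL criteria over k, given the p sample
    eigenvalues from N snapshots."""
    p = len(eigvals)
    l = np.sort(eigvals)[::-1]
    aic, mdl = {}, {}
    for k in range(p):                      # k = number of "signal" eigenvalues
        tail = l[k:]
        ratio = np.exp(np.mean(np.log(tail))) / np.mean(tail)   # GM / AM of the tail
        ll = N * (p - k) * np.log(ratio)
        aic[k] = -2 * ll + 2 * k * (2 * p - k)
        mdl[k] = -ll + 0.5 * k * (2 * p - k) * np.log(N)
    return min(aic, key=aic.get), min(mdl, key=mdl.get)

# Usage: 2 signal components plus white noise in a 6-dim observation, N snapshots.
rng = np.random.default_rng(2)
N = 400
S = rng.normal(size=(N, 2)) @ rng.normal(size=(2, 6))
X = S + 0.5 * rng.normal(size=(N, 6))
eigvals = np.linalg.eigvalsh(np.cov(X, rowvar=False))
print(noise_subspace_dim(eigvals, N))       # likely (2, 2)
```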

Journal:
  • Neural networks : the official journal of the International Neural Network Society

Volume: 17, Issue: 8-9

Pages: -

Publication date: 2004